Sequential Normalization: Embracing Smaller Sample Sizes for Normalization
Authors
Abstract
Normalization as a layer within neural networks has over the years demonstrated its effectiveness in neural network optimization across a wide range of different tasks, with one of the most successful approaches being that of batch normalization. The consensus is that better estimates of the BatchNorm normalization statistics (μ and σ²) in each mini-batch result in better optimization. In this work, we challenge this belief and experiment with a new variant, GhostNorm, that, despite independently normalizing batches within the mini-batches, i.e., μ and σ² are computed and applied to groups of samples within each mini-batch, consistently outperforms BatchNorm. Next, we introduce sequential normalization (SeqNorm), the application of the above type of normalization across two dimensions of the input, and find that models trained with SeqNorm consistently outperform models trained with BatchNorm or GhostNorm on multiple image classification data sets. Our contributions are as follows: (i) we uncover a source of regularization that is unique to GhostNorm, and not simply an extension from BatchNorm, and illustrate its effects on the loss landscape; (ii) we introduce sequential normalization (SeqNorm), which further improves performance; (iii) we compare both GhostNorm and SeqNorm against BatchNorm alone as well as other regularization techniques; (iv) we train models whose performance is consistently better than our baselines, including ones on the standard data sets CIFAR-10, CIFAR-100, and ImageNet ((+0.2%, +0.7%, +0.4%) and (+0.3%, +1.7%, +1.1%) for GhostNorm and SeqNorm, respectively).
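The core idea behind GhostNorm, as described in the abstract, is to split each mini-batch into smaller groups of samples and compute the normalization statistics (μ and σ²) independently per group, rather than once over the whole mini-batch. A minimal NumPy sketch, assuming a 2-D input of shape (batch, features) and omitting the learnable scale/shift parameters and running statistics a full layer would carry:

```python
import numpy as np

def ghost_norm(x, num_groups, eps=1e-5):
    """Normalize each group of samples in the mini-batch independently.

    x: array of shape (N, C); num_groups must divide N.
    With num_groups=1 this reduces to plain BatchNorm statistics.
    """
    n, c = x.shape
    assert n % num_groups == 0, "batch size must be divisible by num_groups"
    g = x.reshape(num_groups, n // num_groups, c)
    mu = g.mean(axis=1, keepdims=True)   # per-group mean (one mu per group)
    var = g.var(axis=1, keepdims=True)   # per-group variance (one sigma^2 per group)
    return ((g - mu) / np.sqrt(var + eps)).reshape(n, c)

# Example: a mini-batch of 8 samples split into 2 "ghost" groups of 4.
batch = np.random.randn(8, 4)
out = ghost_norm(batch, num_groups=2)
```

SeqNorm, per the abstract, applies this type of normalization sequentially across two dimensions of the input; one plausible reading is composing a group-wise normalization over channels with the batch-group normalization above, but the exact composition is defined in the paper itself.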
Similar Resources
Development and Normalization of Emotion Regulation Strategies about Germophobia Questionnaire: A Pilot Study in Iranian Sample
Objective: COVID-19 has recently been identified as a pandemic by the World Health Organization. The outbreak of the disease has caused many people around the world to become extremely frightened, and they show phobic signs. Fear is the basic emotion of anxiety disorders, and individuals cope with their emotions through different strategies. The purpose of the present study was to develop and normalize an emo...
Look-ahead sequential feature vector normalization for noisy speech recognition
Cepstral mean subtraction (CMS), which is a simple long-term bias removal, is used to compensate for transmission and linear fixed channel effects. In order to process the non-linear channel, a two-level CMS was proposed where separate channel compensation is performed for segments that are classified as speech and for segments classified as background. In this paper, methods for extending the two-le...
Weak βη-Normalization and Normalization by Evaluation for System F
A general version of the fundamental theorem for System F is presented which can be instantiated to obtain proofs of weak β- and βη-normalization and normalization by evaluation.
Automatically Extracting Variant-Normalization Pairs for Japanese Text Normalization
Social media texts, such as tweets from Twitter, contain many types of nonstandard tokens, and the number of normalization approaches for handling such noisy text has been increasing. We present a method for automatically extracting pairs of a variant word and its normal form from unsegmented text on the basis of a pair-wise similarity approach. We incorporated the acquired variant-normalizatio...
Journal
Journal title: Information
Year: 2022
ISSN: 2078-2489
DOI: https://doi.org/10.3390/info13070337